102 research outputs found

    In-situ and remote monitoring of environmental water quality

    Environmental water pollution affects human health and reduces the quality of our natural water ecosystems and resources. As a result, there is great interest in monitoring water quality and ensuring that all areas are compliant with legislation. Ubiquitous water quality monitoring places considerable demands upon existing sensing technology. The combined challenges of system longevity, autonomous operation, robustness, large-scale sensor networks, operationally difficult deployments, and unpredictable and lossy environments collectively represent a technological barrier that has yet to be overcome [1]. Ubiquitous sensing envisages many aspects of our environment being routinely sensed. This will result in data streams from a large variety of heterogeneous sources, which will often vary in their volume and accuracy. The challenge is to develop a networked sensing infrastructure that can support the effective capture, filtering, aggregation and analysis of such data. This will ultimately enable us to dynamically monitor and track the quality of our environment at multiple locations. Microfluidic technology provides a route to the development of miniaturised analytical instruments that could be deployed remotely and operate autonomously over relatively long periods of time (months to years). An example of such a system is the autonomous phosphate sensor [2] developed at the CLARITY Centre in Dublin City University. This technology, in combination with the availability of low-power, reliable wireless communications platforms that can link sensors and analytical devices to online databases and servers, forms the basis for extensive networks of autonomous analytical ‘stations’ or ‘nodes’ that will provide high-quality information about key chemical parameters that determine the quality of our aquatic environment. The system must also have sufficient intelligence to enable adaptive sampling regimes as well as accurate and efficient decision-making responses. A particularly exciting area of development is the combination of remote satellite- or aircraft-based monitoring with the in-situ ground-based monitoring described above. Remote observations from satellites and aircraft can provide significant amounts of information on the state of the aquatic environment over large areas. As in-situ deployments of sensor networks become more widespread and reliable, and satellite data becomes more widely available, information from each of these sources can complement and validate the other, leading to an increased ability to rapidly detect potentially harmful events, and to assess the impact of environmental pressures on scales ranging from small river catchments to the open ocean. In this paper, we assess the current status of these approaches and the challenges that must be met in order to realise the vision of true internet- or global-scale monitoring of our environment.
    References:
    [1] King Tong Lau, Sarah Brady, John Cleary and Dermot Diamond, "Integration of analytical measurements and wireless communications – current issues and future strategies", Talanta 75 (2008) 606–612.
    [2] John Cleary, Conor Slater, Christina McGraw and Dermot Diamond, "An autonomous microfluidic sensor for phosphate: on-site analysis of treated wastewater", IEEE Sensors Journal 8 (2008) 508–515.
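
    As a rough illustration of the adaptive sampling regime mentioned above, the sketch below adjusts how often an autonomous nutrient sensor samples based on its most recent readings. It is a minimal sketch only: the function name, the 0.1 mg/L alert level and the interval bounds are illustrative assumptions, not values or logic from the cited work.

```python
# Minimal sketch of an adaptive sampling policy for an autonomous nutrient
# sensor node. Threshold values and interval bounds are illustrative
# assumptions, not figures from the cited work.

def next_sampling_interval(latest_mg_l, previous_mg_l,
                           alert_level_mg_l=0.1,
                           min_interval_h=1.0, max_interval_h=12.0):
    """Return how many hours to wait before the next phosphate measurement.

    Sample more often when the reading is near the alert level or changing
    quickly; relax towards the maximum interval otherwise, conserving
    reagents and battery over long unattended deployments.
    """
    change = abs(latest_mg_l - previous_mg_l)
    if latest_mg_l >= alert_level_mg_l or change > 0.5 * alert_level_mg_l:
        return min_interval_h              # event suspected: sample hourly
    if latest_mg_l >= 0.5 * alert_level_mg_l:
        return 0.5 * max_interval_h        # elevated but stable
    return max_interval_h                  # background conditions


print(next_sampling_interval(0.12, 0.08))  # 1.0 (near the alert level)
print(next_sampling_interval(0.02, 0.02))  # 12.0 (background)
```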

    iForgot: a model of forgetting in robotic memories

    Much effort has focused in recent years on developing more life-like robots. In this paper we propose a model of memory for robots based on human digital memories. Our model incorporates an element of forgetting to ensure that the robotic memory appears more human and can therefore address some of the challenges of human-robot interaction.
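
    One plausible way to realise a forgetting element of this kind is an exponential-decay retention score that decays more slowly for frequently recalled memories. The sketch below is only an illustration under that assumption; the half-life, recall boost and pruning threshold are invented for the example and are not the model proposed in the paper.

```python
import math
import time

# Illustrative forgetting rule: retention decays exponentially with age and
# decays more slowly for memories that have been recalled often. All
# parameters here are assumptions made for the example.

def retention(age_s, recalls, half_life_s=7 * 24 * 3600.0, recall_boost=0.5):
    """Return a retention score in [0, 1]; old, rarely recalled memories fade."""
    effective_half_life = half_life_s * (1.0 + recall_boost * recalls)
    return math.exp(-math.log(2) * age_s / effective_half_life)


def prune(memories, now=None, threshold=0.05):
    """Keep only memories whose retention is still above the threshold."""
    now = time.time() if now is None else now
    return [m for m in memories
            if retention(now - m["created"], m["recalls"]) >= threshold]
```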

    A structural alignment model of noun-noun compound interpretation

    The interpretation of noun-noun compounds is complex, yet compounds such as 'web surfer' and 'beef baron' are generated and interpreted easily by native English speakers. Concept combination is the core process in the generation and interpretation of noun-noun compounds. Such compounds may be read literally or metaphorically, suggesting that the combination process is capable of both literal and metaphoric interpretations. The motivation for this thesis is to tackle three problems which occur in concept combination. These problems are: (1) compounds are often polysemous, (2) compounds often appear to be understood by evoking a context (or world knowledge) and (3) compounds can be interpreted figuratively. We suggest that adopting structural alignment allows us to deal with each of these problems. Structural alignment is a process whereby conceptual structures are placed into correspondence and similarities are found. The structural alignment model proposed in this thesis suggests that there are six core combination types and that an interpretation of a noun-noun compound will fall into one of these combination types. Some of these combination types are figurative and some rely on finding a context. We provide an implementation of the model, the INCA system. INCA is a program with which a user can find interpretations for noun-noun compounds. INCA has a knowledge base and attempts to find fixed patterns in a network representation of concepts. Depending on the type of pattern found, several types of interpretation can be generated. The performance of INCA is compared with that of a number of human subjects in a brief evaluation study. The study shows that the combination types proposed by our structural alignment model offer good coverage of the interpretations that people generate. Finally, we set out proposals for developing INCA further and outline directions for future research.
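
    To make the idea of finding fixed patterns in a network representation of concepts more concrete, the sketch below looks for a head-noun relation whose argument slot can be filled by the modifier or by one of its associations. The toy knowledge base and the relational_interpretations helper are hypothetical illustrations, not INCA's actual representation or pattern set.

```python
# Toy concept network: each concept is a set of (relation, argument) pairs.
# This knowledge base is invented for illustration only.
KB = {
    "knife":   {("cuts", "food"), ("used_in", "kitchen")},
    "kitchen": {("is_a", "room"), ("contains", "food")},
    "bread":   {("is_a", "food")},
}


def relational_interpretations(modifier, head):
    """Return head relations whose argument slot matches the modifier
    or something the modifier is directly associated with."""
    head_relations = KB.get(head, set())
    modifier_terms = {arg for _, arg in KB.get(modifier, set())} | {modifier}
    return [(relation, slot) for relation, slot in head_relations
            if slot in modifier_terms]


# 'bread knife' -> [('cuts', 'food')]: a knife for cutting bread (bread is_a food)
print(relational_interpretations("bread", "knife"))
```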

    Image processing for smart browsing of ocean colour data products and subsequent incorporation into a multi-modal sensing framework

    Ocean colour is defined as the water hue due to the presence of tiny plants containing the pigment chlorophyll, sediments and coloured dissolved organic material, and so water colour can provide valuable information on coastal ecosystems. The ‘Ocean Colour project’ collects data from various satellites (e.g. MERIS, MODIS) and makes this data available online. One method of searching the Ocean Colour project data is to visually browse level 1 and level 2 data. Users can search via location (regions), time and data type. They are presented with images which cover chlorophyll, quasi-true colour and sea surface temperature (11 μm), and links to the source data. However, it is often preferable for users to search such a complex and large dataset by event and to analyse the distribution of colour in an image before examination of the source data. This will allow users to browse and search ocean colour data more efficiently and to include this information more seamlessly into a framework that incorporates sensor information from a variety of modalities. This paper presents a system for more efficient management and analysis of ocean colour data and suggests how this information can be incorporated into a multi-modal sensing framework for a smarter, more adaptive environmental sensor network.
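
    As an illustration of summarising the distribution of colour in a browse image so that scenes can be indexed and searched by event, the sketch below builds a coarse colour histogram from a quick-look image. The file name, bin size and use of Pillow are assumptions made for the example, not the system described in the paper.

```python
from collections import Counter
from PIL import Image

# Summarise a browse image (e.g. a level-2 chlorophyll quick-look) by its
# dominant coarse colours so scenes can later be searched by event rather
# than only by region and date. File name and bin size are illustrative.

def colour_histogram(path, bins_per_channel=4):
    """Return a Counter of coarsely quantised RGB colours in the image."""
    img = Image.open(path).convert("RGB").resize((128, 128))
    step = 256 // bins_per_channel
    return Counter((r // step, g // step, b // step) for r, g, b in img.getdata())


hist = colour_histogram("chlor_a_quicklook.png")
print(hist.most_common(5))   # the five dominant coarse colours
```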

    Image processing for smarter browsing of ocean color data products: investigating algal blooms

    Remote sensing technology continues to play a significant role in the understanding of our environment and the investigation of the Earth. Ocean color is the water hue due to the presence of tiny plants containing the pigment chlorophyll, sediments, and colored dissolved organic material, and so can provide valuable information on coastal ecosystems. We propose to make the browsing of Ocean Color data more efficient for users by using image processing techniques to extract useful information which can be accessible through browser searching. Image processing is applied to chlorophyll and sea surface temperature images. The automatic image processing of the visual level 1 and level 2 data allows us to investigate the occurrence of algal blooms. Images with colors in a certain range (red, orange, etc.) are used to identify possible algal blooms and allow us to examine the seasonal variation of algal blooms in Europe (around Ireland and in the Baltic Sea). Yearly seasonal variations of algal blooms in Europe, based on image processing for smarter browsing of Ocean Color data, are presented.
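
    The colour-range test described above might look something like the following sketch, which flags a scene when the fraction of reddish/orange pixels exceeds a threshold. The RGB bounds, the 2% threshold and the file name are assumptions for the example, not values from the paper.

```python
from PIL import Image

# Flag a chlorophyll browse image as a possible bloom when enough pixels
# fall in a crude red/orange range. All numbers are illustrative.

def possible_bloom(path, threshold=0.02):
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())
    reddish = sum(1 for r, g, b in pixels
                  if r > 150 and g < 120 and b < 100)
    return reddish / len(pixels) >= threshold


if possible_bloom("modis_chlor_a_quicklook.png"):
    print("Scene flagged for closer inspection of the source data")
```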

    Automated Extraction of Fragments of Bayesian Networks from Textual Sources

    Mining large amounts of unstructured data to extract meaningful, accurate, and actionable information is at the core of a variety of research disciplines including computer science, mathematical and statistical modelling, as well as knowledge engineering. In particular, the ability to model complex scenarios based on unstructured datasets is an important step towards an integrated and accurate knowledge extraction approach. This would provide significant insight into any decision-making process driven by Big Data analysis activities. However, there are multiple challenges that need to be fully addressed in order to achieve this, especially when large and unstructured data sets are considered. In this article we propose and analyse a novel method to extract and build fragments of Bayesian networks (BNs) from unstructured large data sources. The results of our analysis show the potential of our approach, and highlight its accuracy and efficiency. More specifically, when compared with existing approaches, our method addresses specific challenges posed by the automated extraction of BNs with extensive applications to unstructured and highly dynamic data sources. The aim of this work is to advance the current state-of-the-art approaches to the automated extraction of BNs from unstructured datasets, which provide a versatile and powerful modelling framework to facilitate knowledge discovery in complex decision scenarios.
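
    For a sense of what assembling a BN fragment from text-derived relations could involve, the sketch below takes (cause, effect) pairs, assumed to have already been extracted from documents elsewhere, and builds parent sets while rejecting edges that would create a cycle. The example pairs and the build_fragment helper are illustrative; they are not the paper's extraction or construction algorithm.

```python
from collections import defaultdict

# (cause, effect) pairs assumed to come from an upstream text-extraction step;
# the pairs themselves are invented for illustration.
extracted = [
    ("smoking", "lung_cancer"),
    ("air_pollution", "lung_cancer"),
    ("lung_cancer", "positive_xray"),
]


def build_fragment(edges):
    """Return a parent set per node, skipping edges that would create a cycle."""
    parents = defaultdict(set)

    def reachable(src, dst):
        # Follow child links (node -> nodes that list it as a parent).
        stack, seen = [src], set()
        while stack:
            node = stack.pop()
            if node == dst:
                return True
            if node in seen:
                continue
            seen.add(node)
            stack.extend(child for child, ps in parents.items() if node in ps)
        return False

    for cause, effect in edges:
        if reachable(effect, cause):   # this edge would close a cycle; skip it
            continue
        parents[effect].add(cause)
        parents.setdefault(cause, set())
    return dict(parents)


for node, node_parents in build_fragment(extracted).items():
    print(node, "<-", sorted(node_parents))
```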

    A Data Centre Air Flow Model for Predicting Computer Server Inlet Temperatures

    Data centres account for approximately 1.3% of the world's electricity consumption, of which up to 50% is dedicated to keeping the actual equipment cool. This represents a huge opportunity to reduce data centre energy consumption by tackling cooling system operations with a focus on thermal management. This work presents a novel Data Centre Air Flow Model (DCAM) for predicting computer server inlet temperatures. The model is a physics-based model underpinned by turbulent jet theory, allowing a reduction in the solution domain size by using only local boundary conditions in front of the servers. Current physics-based modelling approaches require a solution domain of the entire data centre room, which is computationally expensive even if a small change occurs in a localised area. By limiting the solution domain and boundary conditions to a local level, the model focuses on the airflow mixing that affects temperatures while also simplifying the related computations. The DCAM model does not have the usual complexities of numerical computations, dependencies on computational grid size, meshing, or the need to solve a full domain solution. The input boundary conditions required for the model can be supplied by the Building Management System (BMS), Power Distribution Units (PDUs), sensors, or output from other modelling environments, and only need updating when significant changes occur. Preliminary results validated on a real-world data centre yield an overall prediction error of 1.2°C RMSE. The model can perform in real time, opening the way to applications in real-time monitoring and optimised control of air conditioning units, and it can complement sensor networks.
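
    To illustrate the underlying idea that a server inlet temperature is governed by how much hot exhaust air mixes into the cold-aisle supply, the sketch below performs a simple flow-weighted mix. It only illustrates that mixing principle; it is not the DCAM model or its turbulent jet formulation, and all numbers are assumptions.

```python
# Flow-weighted mixing of cold-aisle supply air and recirculated hot exhaust
# at a server inlet. Illustrative only; not the DCAM formulation.

def inlet_temperature(supply_temp_c, exhaust_temp_c,
                      supply_flow_m3s, recirculated_flow_m3s):
    """Predict the inlet temperature as a flow-weighted mix of the two streams."""
    total_flow = supply_flow_m3s + recirculated_flow_m3s
    if total_flow <= 0:
        raise ValueError("at least one airflow must be positive")
    return (supply_temp_c * supply_flow_m3s
            + exhaust_temp_c * recirculated_flow_m3s) / total_flow


# 10% recirculation of 35 degC exhaust into 18 degC supply air:
print(round(inlet_temperature(18.0, 35.0, 0.9, 0.1), 2))   # 19.7
```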

    NextGen AML: distributed deep learning based language technologies to augment anti money laundering Investigation

    Most current anti-money laundering (AML) systems, using handcrafted rules, are heavily reliant on existing structured databases, which are not capable of effectively and efficiently identifying hidden and complex ML activities, especially those with dynamic and time-varying characteristics, resulting in a high percentage of false positives. Therefore, analysts are engaged for further investigation, which significantly increases human capital cost and processing time. To alleviate these issues, this paper presents a novel framework for next-generation AML by applying and visualizing deep learning-driven natural language processing (NLP) technologies in a distributed and scalable manner to augment AML monitoring and investigation. The proposed distributed framework performs news and tweet sentiment analysis, entity recognition, relation extraction, entity linking and link analysis on different data sources (e.g. news articles and tweets) to provide additional evidence to human investigators for final decision-making. Each NLP module is evaluated on a task-specific data set, and the overall experiments are performed on synthetic and real-world datasets. Feedback from AML practitioners suggests that our system can reduce investigation time and cost by approximately 30% compared to their previous manual approaches to AML investigation.
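
    A pipeline of the kind described above could be organised as pluggable per-document stages, as in the sketch below. The stage implementations are trivial placeholders standing in for the deep learning models, and the function and field names are assumptions made for the example.

```python
from typing import Callable, Dict, List

Stage = Callable[[Dict], Dict]   # each stage enriches a document dict


def sentiment(doc: Dict) -> Dict:
    # Placeholder for a learned sentiment model.
    doc["sentiment"] = "negative" if "fraud" in doc["text"].lower() else "neutral"
    return doc


def entities(doc: Dict) -> Dict:
    # Placeholder for a learned named-entity recogniser.
    doc["entities"] = [token for token in doc["text"].split() if token.istitle()]
    return doc


def run_pipeline(docs: List[Dict], stages: List[Stage]) -> List[Dict]:
    """Apply each stage in order to every document; trivially parallelisable."""
    processed = []
    for doc in docs:
        for stage in stages:
            doc = stage(doc)
        processed.append(doc)
    return processed


news = [{"text": "Acme Bank investigated over alleged fraud scheme"}]
print(run_pipeline(news, [sentiment, entities]))
```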

    Environmental monitoring of Galway Bay: fusing data from remote and in-situ sources

    Changes in sea surface temperature can be used as an indicator of water quality. In-situ sensors are being used for continuous autonomous monitoring. However, these sensors have limited spatial resolution as they are in effect single-point sensors. Satellite remote sensing can be used to provide better spatial coverage at good temporal scales. However, in-situ sensors have a richer temporal scale for a particular point of interest. Work carried out in Galway Bay has combined data from multiple satellite sources and in-situ sensors and investigated the benefits and drawbacks of using multiple sensing modalities for monitoring a marine location.
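
    One straightforward way to combine the two modalities is to pair each satellite sea surface temperature retrieval with the nearest-in-time in-situ reading and examine the difference, as in the sketch below. The timestamps, temperatures and one-hour matching window are illustrative assumptions, not data or methods from the work described.

```python
from datetime import datetime, timedelta

in_situ = [  # (timestamp, SST in degC) from a moored sensor; values are invented
    (datetime(2013, 7, 1, 12, 0), 15.2),
    (datetime(2013, 7, 1, 12, 15), 15.3),
    (datetime(2013, 7, 1, 12, 30), 15.4),
]

satellite = [  # (overpass time, retrieved SST in degC); values are invented
    (datetime(2013, 7, 1, 12, 20), 15.6),
]


def match(satellite_obs, in_situ_obs, max_gap=timedelta(hours=1)):
    """Pair each satellite retrieval with the closest in-situ reading in time."""
    pairs = []
    for sat_time, sat_sst in satellite_obs:
        nearest_time, nearest_sst = min(in_situ_obs,
                                        key=lambda rec: abs(rec[0] - sat_time))
        if abs(nearest_time - sat_time) <= max_gap:
            pairs.append((sat_time, sat_sst, nearest_sst, sat_sst - nearest_sst))
    return pairs


for sat_time, sat_sst, situ_sst, diff in match(satellite, in_situ):
    print(sat_time, f"satellite {sat_sst:.1f}C in-situ {situ_sst:.1f}C diff {diff:+.1f}C")
```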